
    Inputs and outputs: engagement in digital media from the maker's perspective

    In the process of developing a technology assembly that can objectively measure engagement on a moment-by-moment basis, subjective responses to stimuli must be shown to correlate with the component technologies, such as motion capture or psychophysiology. Subjective scales for engagement are not consistent in separating the measurement of causes (inputs to the audience) from effects (outputs from the audience); this lack of separation can obscure inferences about the relationship between cause and effect. Inputs to the audience are scripted and controllable by the maker. An output is what the designed experience engenders in the end-user; outputs can include both mental states (satisfaction) and physical activities (heart rate) during the stimulus and afterwards. Inputs can be maximised by design, whereas optimising outputs from the end-user requires an empirical process, because outputs depend on an interpretive process or entry into a biological system. Outputs are highly dependent on audience and context, and will often vary considerably even among individuals with a similar audience profile. In instruments assessing the relationship between inputs and outputs, it is therefore critical that controllable inputs to the end-user are not conflated with the outputs they engender in the end-user.

    Using body language indicators for assessing the effects of soundscape quality on individuals

    “Sounding Brighton” is a collaborative project exploring practical approaches to better soundscapes, focusing on soundscape issues related to health, quality of life and the restorative functions of the environment. The project is part of a citywide engagement process working to provide opportunities to demonstrate how an applied soundscape approach might tackle conventional noise problems, contribute to local planning and improve the environment in areas including urban green spaces, the built environment and traffic noise. So far, a soundscape map of the city has been developed, and a public outreach exhibition and conferences have taken place. One preliminary, experimental soundscape intervention in night noise has been analysed. This paper reports on further work to develop a better understanding of individual and community responses to soundscapes through the use of body language indicators. Two-minute excerpts of aversive and preferred music were presented to 11 healthy volunteers in a motion-capture laboratory setting. Their responses were quantified computationally using motion-capture-derived parameters for position, absolute movement speed and stillness. The prevalence of stillness in head height (based on a 2 cm cut-off over 2-second sectors) was significantly lower when volunteers were exposed to unpleasant music than to preferred music. This experiment provides proof in principle that changes in soundscape can be associated with subsequent, objective and statistically significant changes in body language that can be detected computationally.
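The abstract above describes a stillness metric (a 2 cm cut-off applied over 2-second sectors of head height) without giving its implementation. A minimal sketch of such a metric, assuming head height is sampled at a fixed rate; the function name and parameters here are illustrative, not taken from the paper:

```python
import numpy as np

def stillness_prevalence(head_height_m, fs_hz, window_s=2.0, cutoff_m=0.02):
    """Fraction of fixed-length windows in which head height varies by
    no more than the cut-off, i.e. the head is effectively still.

    head_height_m : 1-D array of head height samples (metres)
    fs_hz         : sampling rate of the stream (Hz)
    window_s      : sector length (seconds), 2 s as in the abstract
    cutoff_m      : stillness cut-off (metres), 2 cm as in the abstract
    """
    head_height_m = np.asarray(head_height_m, dtype=float)
    samples_per_window = int(round(window_s * fs_hz))
    n_windows = len(head_height_m) // samples_per_window
    if n_windows == 0:
        return 0.0
    still = 0
    for i in range(n_windows):
        window = head_height_m[i * samples_per_window:(i + 1) * samples_per_window]
        # A sector counts as "still" if its peak-to-peak range is within the cut-off.
        if np.ptp(window) <= cutoff_m:
            still += 1
    return still / n_windows
```

Comparing this prevalence between the aversive and preferred music conditions would then be a straightforward paired statistical test per participant.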

    Spatial data correlation: an interactive 3D visualisation tool for correlating the motion capture data streams from different devices

    We have developed an interactive, three-dimensional visualisation that plots individual motion capture data streams so they can be visually inspected concurrently, yielding a unified, multimodal view of the data. The primary objective of this study is to facilitate analysis of groups of putatively correlated time series with large numbers of data points. Our unique contribution is a three-dimensional data visualisation tool with fly-through capabilities, designed to distinguish false correlations from genuine correlations.
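The visual inspection described above is typically complemented by a numerical summary of how strongly the streams co-vary. A minimal sketch of such a summary, assuming equally sampled, time-aligned streams; the function name and the use of a Pearson correlation matrix are illustrative assumptions, not details from the paper:

```python
import numpy as np

def pairwise_correlations(streams):
    """Pearson correlation matrix for a dict of equally sampled,
    time-aligned 1-D motion capture streams (name -> samples).

    Returns the sorted stream names and the matching correlation
    matrix, so candidate (possibly spurious) correlations can be
    ranked before visual inspection in the 3D tool.
    """
    names = sorted(streams)
    data = np.vstack([np.asarray(streams[n], dtype=float) for n in names])
    return names, np.corrcoef(data)
```

High off-diagonal values flag stream pairs worth inspecting; the fly-through view then helps judge whether the correlation is genuine or an artefact.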

    Comparing four technologies for measuring postural micromovements during monitor engagement

    Objective metrics of engagement are valuable for estimating user experience or progression through interactional narratives. Postural micromovements of seated individuals during computer engagement have previously been measured with magnetic field sensors and chair-mounted force matrix detection mats. Here we compare readings from a head-mounted accelerometer, single-camera sagittal motion tracking, and force distribution changes from floor-mounted force plates against a Vicon 8-camera motion capture system. Measurements were recorded from five participants while they watched or interacted with a computer monitor. Our results show that sagittal and coronal plane measurements from the Vicon, the accelerometer and the single camera produced nearly identical data, were precisely synchronised in time, and were in many cases proportional in amplitude. None of the systems tested was able to match the Vicon's measurement of yaw.
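The comparison above rests on two properties of each candidate system relative to the Vicon reference: agreement in waveform shape and proportionality in amplitude. A minimal sketch of how those two properties might be quantified for a pair of synchronised streams; the function name and the choice of Pearson correlation plus RMS amplitude ratio are illustrative assumptions, not the paper's stated method:

```python
import numpy as np

def agreement_with_reference(candidate, reference):
    """Compare a candidate sensor stream against a synchronised
    reference (e.g. Vicon) stream.

    Returns (corr, ratio):
      corr  - Pearson correlation after mean removal (shape agreement)
      ratio - RMS amplitude of candidate / RMS amplitude of reference
              (proportionality of amplitude)
    """
    c = np.asarray(candidate, dtype=float)
    r = np.asarray(reference, dtype=float)
    c = c - c.mean()
    r = r - r.mean()
    corr = float(np.corrcoef(c, r)[0, 1])
    ratio = float(np.sqrt(np.mean(c ** 2) / np.mean(r ** 2)))
    return corr, ratio
```

A correlation near 1 with a stable ratio across trials would indicate the kind of "nearly identical, proportional in amplitude" agreement reported for the sagittal and coronal planes.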